Exact and Approximate Methods for Computing the Hessian of a Feedforward Artificial Neural Network

Authors

  • Craig W. Codrington
  • Manoel F. Tenorio
Abstract

We present two optimization techniques based on cubic curve fitting: one based on function values and derivatives at two previous points, the other based on derivatives at three previous points. The latter approach is viewed from a derivative-space perspective, obviating the need to compute the vertical translation of the cubic and thus simplifying the fitting problem. We demonstrate the effectiveness of the second method in training neural networks on parity problems of various sizes, and compare our results to a modified Quickprop algorithm and to gradient descent.
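
Since the derivative of a cubic is a quadratic, fitting a quadratic to the directional derivative at three previous step positions determines the cubic up to a vertical translation, which does not affect where its minimum lies; the next step can then be taken at the root of that quadratic where the derivative crosses zero from below. The sketch below illustrates this general derivative-space idea in one dimension; it is not the authors' implementation, and the helper name, the numpy-based quadratic fit, and the toy cubic are our own assumptions.

    import numpy as np

    def cubic_step_from_derivatives(xs, dfs):
        # Hypothetical helper: xs are three previous step positions along the
        # search direction, dfs the directional derivatives f'(x) there.
        # The derivative of a cubic is a quadratic, so a quadratic fit to the
        # derivatives fixes the cubic up to a vertical shift, which is
        # irrelevant for locating its minimum.
        a, b, c = np.polyfit(xs, dfs, 2)            # f'(x) ~ a*x^2 + b*x + c
        disc = b * b - 4.0 * a * c
        if abs(a) < 1e-12 or disc < 0.0:
            return None                             # no usable cubic minimum; fall back
        roots = (-b + np.array([1.0, -1.0]) * np.sqrt(disc)) / (2.0 * a)
        slopes = 2.0 * a * roots + b                # slope of f' at each root
        pos = slopes > 0                            # minimum: f' crosses zero from below
        return roots[np.argmax(pos)] if pos.any() else None

    # Toy check on f(x) = x^3 - 3x^2 + 4, whose local minimum is at x = 2:
    xs = np.array([-1.0, 0.5, 3.0])
    dfs = 3.0 * xs**2 - 6.0 * xs                    # exact derivatives of the toy cubic
    print(cubic_step_from_derivatives(xs, dfs))     # prints roughly 2.0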


Similar articles

A Hybrid Neural Network Approach for Kinematic Modeling of a Novel 6-UPS Parallel Human-Like Mastication Robot

Introduction: We aimed to introduce a 6-universal-prismatic-spherical (UPS) parallel mechanism for human jaw motion and to theoretically evaluate its kinematic problem. We proposed a strategy to provide a fast and accurate solution to the kinematic problem. The proposed strategy could accelerate the process of solution-finding for the direct kinematic problem by reducing the number of required ...

Full text

Application of Two Methods of Artificial Neural Network MLP, RBF for Estimation of Wind of Sediments (Case Study: Korsya of Darab Plain)

The lack of sediment gauging stations for wind erosion makes estimating sediment both necessary and important. Artificial neural networks can be used as an efficient and effective tool to estimate and simulate sediments. In this paper, two models, a multi-layer perceptron neural network and a radial basis function neural network, were used to estimate the amount of sediment in Korsya o...

Full text

Feedforward and Recurrent Neural Networks Backward Propagation and Hessian in Matrix Form

In this paper we focus on the linear algebra theory behind feedforward (FNN) and recurrent (RNN) neural networks. We review backward propagation, including backward propagation through time (BPTT). Also, we obtain a new exact expression for the Hessian, which represents second-order effects. We show that for t time steps the weight gradient can be expressed as a rank-t matrix, while the weight Hess...

Full text
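
A finite-difference Hessian of a tiny feedforward network is a convenient reference for sanity-checking any exact Hessian computation, whether derived in matrix form as in the entry above or by backpropagation-style recursions. The sketch below is our own illustrative setup, not taken from any paper listed here; the network size, the squared-error loss, and the helper names are assumptions.

    import numpy as np

    def mlp_loss(w, x, y, n_in=2, n_hid=3):
        # Squared-error loss of a toy one-hidden-layer tanh network,
        # with all weights packed into the flat vector w (illustrative setup).
        W1 = w[:n_in * n_hid].reshape(n_hid, n_in)
        W2 = w[n_in * n_hid:].reshape(1, n_hid)
        err = (W2 @ np.tanh(W1 @ x))[0] - y
        return 0.5 * err ** 2

    def numerical_hessian(f, w, eps=1e-5):
        # Central-difference Hessian of f at w; O(n^2) loss evaluations,
        # useful only as a check against an exact (analytic) Hessian.
        n = w.size
        H = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                ei, ej = np.zeros(n), np.zeros(n)
                ei[i], ej[j] = eps, eps
                H[i, j] = (f(w + ei + ej) - f(w + ei - ej)
                           - f(w - ei + ej) + f(w - ei - ej)) / (4 * eps**2)
        return H

    x, y = np.array([0.3, -0.7]), 1.0
    w = np.random.default_rng(0).normal(size=2 * 3 + 3)
    H = numerical_hessian(lambda v: mlp_loss(v, x, y), w)
    print(np.allclose(H, H.T, atol=1e-4))           # a Hessian is symmetric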

Towards a Mathematical Understanding of the Difficulty in Learning with Feedforward Neural Networks

Despite the recent success of deep neural networks in various applications, designing and training deep neural networks is still among the greatest challenges in the field. In this work, we address the challenge of designing and training feedforward Multilayer Perceptrons (MLPs) from a smooth optimisation perspective. By characterising the critical point conditions of an MLP-based loss function...

Full text

Exact Hessian Calculation in Feedforward FIR Neural Networks

FIR neural networks are feedforward neural networks with regular scalar synapses replaced by linear finite impulse response filters. This paper introduces the Second Order Temporal Backpropagation algorithm, which enables the exact calculation of the second-order error derivatives for an FIR neural network. This method is based on the error gradient calculation method first proposed by Wan and re...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 2013